Similar resources
Sparsity oracle inequalities for the Lasso
This paper studies oracle properties of ℓ1-penalized least squares in a nonparametric regression setting with random design. We show that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of non-zero components of the oracle vector. The results are valid even when the dimension of the model is (much) larger than the sample size and t...
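As a rough illustration of the estimator discussed above, the sketch below fits an ℓ1-penalized least squares (Lasso) approximation over a dictionary of functions at random design points, in a regime where the dimension exceeds the sample size. It uses scikit-learn's Lasso with an arbitrary penalty level; the dictionary, data and tuning are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch (not the paper's implementation): l1-penalized least
# squares over a dictionary of p functions evaluated at a random design.
# The dictionary, sample size and penalty level are illustrative choices.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 500                      # sample size much smaller than dimension

# Random design X_i ~ Uniform[0, 1] and a dictionary of cosine functions.
x = rng.uniform(0.0, 1.0, size=n)
dictionary = np.column_stack([np.cos(np.pi * k * x) for k in range(1, p + 1)])

# Sparse "oracle" vector: only 5 non-zero components.
beta_oracle = np.zeros(p)
beta_oracle[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]
y = dictionary @ beta_oracle + 0.1 * rng.standard_normal(n)

# Penalized least squares: argmin_b (1/2n)||y - D b||_2^2 + alpha ||b||_1.
lasso = Lasso(alpha=0.05)            # alpha is a hypothetical tuning choice
lasso.fit(dictionary, y)
print("non-zero coefficients:", np.count_nonzero(lasso.coef_))
```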
The Dantzig selector and sparsity oracle inequalities
and λ̂ := λ̂^ε ∈ Argmin_{λ ∈ Λ̂_ε} ‖λ‖_{ℓ1}. In the case where f∗ := f_{λ∗}, λ∗ ∈ ℝ^N, Candes and Tao [Ann. Statist. 35 (2007) 2313–2351] suggested using λ̂ as an estimator of λ∗. They called this estimator “the Dantzig selector”. We study the properties of f_{λ̂} as an estimator of f∗ for regression models with random design, extending some of the results of Candes and Tao (and providing alternative proofs of thes...
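For concreteness, the following sketch solves the Dantzig selector in its standard linear-programming form, minimizing ‖λ‖₁ subject to ‖Xᵀ(y − Xλ)‖∞ ≤ ε, with scipy's linprog. The constraint level ε and the simulated data are illustrative assumptions, not the paper's setting.

```python
# A sketch of the Dantzig selector in its usual linear-programming form:
#   minimize ||lam||_1  subject to  ||X^T (y - X lam)||_inf <= eps.
# Splitting lam = u - v with u, v >= 0 gives a standard LP. The data and
# eps below are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p, eps = 50, 100, 0.5
X = rng.standard_normal((n, p))
lam_true = np.zeros(p); lam_true[:3] = [1.5, -1.0, 2.0]
y = X @ lam_true + 0.1 * rng.standard_normal(n)

G = X.T @ X
b = X.T @ y

# Variables z = [u, v]; objective sum(u) + sum(v) = ||lam||_1.
c = np.ones(2 * p)
A_ub = np.block([[G, -G], [-G, G]])          # encodes -eps <= X^T(y - X lam) <= eps
b_ub = np.concatenate([b + eps, eps - b])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")

lam_hat = res.x[:p] - res.x[p:]
print("non-zero entries:", np.sum(np.abs(lam_hat) > 1e-6))
```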
Oracle Inequalities for the Lasso in the Cox Model.
We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox proportional hazards regression models where the number of time-dependent covariates can be larger than the sample size. We establish oracle inequalities based on natural extensions of the compatibility and cone invertibility factors of the Hessian matrix at the true regression coefficients. Sim...
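The sketch below is a from-scratch illustration (not the paper's algorithm) of the absolute (ℓ1) penalized partial likelihood idea: proximal gradient descent on the average negative log partial likelihood with soft-thresholding, assuming no tied event times. The data, penalty level and step size are hypothetical.

```python
# A minimal sketch (not the paper's algorithm) of an l1-penalized Cox
# partial likelihood fit via proximal gradient descent, assuming no tied
# event times. All data, step size and penalty level are illustrative.
import numpy as np

def neg_log_partial_lik_grad(beta, X, time, event):
    """Gradient of the (average) negative log partial likelihood."""
    n = X.shape[0]
    eta = X @ beta
    w = np.exp(eta)
    grad = np.zeros_like(beta)
    for i in np.flatnonzero(event):
        at_risk = time >= time[i]                 # risk set R_i
        probs = w[at_risk] / w[at_risk].sum()
        grad -= X[i] - probs @ X[at_risk]
    return grad / n

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cox(X, time, event, alpha=0.1, step=0.2, n_iter=500):
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = neg_log_partial_lik_grad(beta, X, time, event)
        beta = soft_threshold(beta - step * grad, step * alpha)
    return beta

# Illustrative data: p > n with 3 truly relevant covariates.
rng = np.random.default_rng(0)
n, p = 80, 200
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:3] = [1.0, -1.0, 0.5]
time = rng.exponential(scale=np.exp(-X @ beta_true))
event = rng.random(n) < 0.8                       # ~80% observed events
beta_hat = lasso_cox(X, time, event, alpha=0.05)
print("selected covariates:", np.flatnonzero(beta_hat))
```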
Non-asymptotic Oracle Inequalities for the Lasso and Group Lasso in high dimensional logistic model
We consider the problem of estimating a function f0 in a logistic regression model. We propose to estimate this function f0 by a sparse approximation built as a linear combination of elements of a given dictionary of p functions. This sparse approximation is selected by the Lasso or Group Lasso procedure. In this context, we state non-asymptotic oracle inequalities for the Lasso and Group Lasso under...
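As a small illustration of the Lasso part of this setup, the sketch below fits an ℓ1-penalized logistic regression over a dictionary of p functions with scikit-learn; the dictionary, the regularization constant C and the simulated responses are illustrative assumptions. The Group Lasso variant, which penalizes groupwise ℓ2 norms instead, is not shown here.

```python
# A minimal sketch of the Lasso-type (l1-penalized) logistic fit over a
# dictionary of p functions; not the paper's estimator or tuning.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 200, 400
x = rng.uniform(0.0, 1.0, size=n)
# Dictionary of p cosine functions evaluated at the design points.
D = np.column_stack([np.cos(np.pi * k * x) for k in range(1, p + 1)])

theta = np.zeros(p); theta[:4] = [2.0, -1.5, 1.0, -1.0]   # sparse truth
prob = 1.0 / (1.0 + np.exp(-(D @ theta)))
y = rng.random(n) < prob                                   # Bernoulli responses

# C is the inverse penalty strength; the value is an illustrative choice.
clf = LogisticRegression(penalty="l1", solver="saga", C=0.5, max_iter=5000)
clf.fit(D, y)
print("non-zero coefficients:", np.count_nonzero(clf.coef_))
```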
Oracle Inequalities and Optimal Inference under Group Sparsity
We consider the problem of estimating a sparse linear regression vector β∗ under a Gaussian noise model, for the purpose of both prediction and model selection. We assume that prior knowledge is available on the sparsity pattern, namely that the set of variables is partitioned into prescribed groups, only a few of which are relevant in the estimation process. This group sparsity assumption suggests us...
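To make the group sparsity assumption concrete, the sketch below implements a Group Lasso for linear regression by proximal gradient descent with blockwise soft-thresholding over a prescribed partition of the variables. The partition, penalty level and data are illustrative choices rather than the paper's.

```python
# A minimal sketch of the Group Lasso for linear regression with a
# prescribed partition of the variables, fitted by proximal gradient
# descent with blockwise soft-thresholding.
import numpy as np

def group_soft_threshold(z, t):
    norm = np.linalg.norm(z)
    return np.zeros_like(z) if norm <= t else (1.0 - t / norm) * z

def group_lasso(X, y, groups, alpha=0.1, step=None, n_iter=1000):
    n, p = X.shape
    if step is None:
        step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)   # 1 / Lipschitz constant
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        z = beta - step * grad
        for g in groups:                               # prox, one block at a time
            beta[g] = group_soft_threshold(z[g], step * alpha * np.sqrt(len(g)))
    return beta

rng = np.random.default_rng(0)
n, p, group_size = 100, 200, 10
groups = [np.arange(i, i + group_size) for i in range(0, p, group_size)]

X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[groups[0]] = 1.0                             # only the first group...
beta_true[groups[3]] = -0.5                            # ...and the fourth are relevant
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_hat = group_lasso(X, y, groups, alpha=0.1)
active = [i for i, g in enumerate(groups) if np.linalg.norm(beta_hat[g]) > 1e-6]
print("active groups:", active)
```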
Journal
Journal title: Electronic Journal of Statistics
Year: 2007
ISSN: 1935-7524
DOI: 10.1214/07-ejs008